
    Experimental Characterization of Combustion Instabilities and Flow-Flame Dynamics in a Partially-Premixed Gas Turbine Model Combustor.

    Partially premixed swirl combustion is used in gas turbine combustors to achieve flame stabilization and reduced emissions. However, this approach is inherently sensitive to combustion instabilities, which can cause large pressure, velocity, and heat-release fluctuations. This thesis investigates thermoacoustic coupling created by flow-flame dynamics in a gas turbine model combustor (GTMC) for a variety of fuels and operating flow rates. Several naturally occurring instability modes were identified as controlling the acoustic response of the system, including Helmholtz resonances from the plenum and convective-acoustic effects that cause equivalence-ratio oscillations. Laser Doppler velocimetry was used to measure radial flow in the GTMC, which can set up flow fields that create loudly resonating flat-shaped flames, in contrast to quiet V-shaped flames. Flame location and shape altered the convective time delays that determine the relative phases of the pressure and heat-release oscillations. Simultaneous pressure and chemiluminescence imaging showed that the heat release, pressure fluctuations, and flame motion are all coupled at the same instability frequency. Videos of the flame motion also revealed that the precessing vortex core (PVC), created by the swirling flow, influences the rocking behavior of the flame. Acetone was added to the fuel as a tracer in fluorescence measurements that indicated where unburned fuel was localized. Fuel was found to be distributed in lobes surrounding the shear layer outside the central recirculation zone, and the relative distribution of the lobes adjusted in response to forcing by the flow. Finally, high-speed formaldehyde planar laser-induced fluorescence was applied to study the motion of preheat-zone surfaces in response to the oscillations of the instability.
The flame surface density and wrinkling fluctuated at the acoustic frequency and displayed damped motions correlated with the PVC precession. In non-resonating flames, the behavior of the formaldehyde structure and marked flame surfaces was dominated by the PVC motion, but the degree of surface-area fluctuation was reduced compared with unstable flames. Instabilities in the GTMC are driven by a complex combination of thermoacoustic and flow-field couplings that are influenced by the operating conditions, fueling, mixing, and convective time delays.
    PhD thesis, Aerospace Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/102385/1/pallison_1.pd
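The plenum Helmholtz resonance mentioned in the abstract has a standard textbook frequency estimate. The sketch below applies the classical resonator formula with purely illustrative geometry; the neck area, cavity volume, and neck length are assumptions for demonstration, not the GTMC's actual dimensions.

```python
import math

def helmholtz_frequency(c, neck_area, cavity_volume, neck_length):
    """Resonant frequency (Hz) of an ideal Helmholtz resonator:
    f = (c / 2*pi) * sqrt(A / (V * L))."""
    return (c / (2 * math.pi)) * math.sqrt(neck_area / (cavity_volume * neck_length))

# Illustrative (assumed) geometry: 5 cm^2 neck, 2 L cavity, 2 cm neck length
f = helmholtz_frequency(c=343.0, neck_area=5e-4, cavity_volume=2e-3, neck_length=0.02)
print(f"{f:.0f} Hz")  # a few hundred Hz, the typical combustion-instability range
```

With these assumed dimensions the estimate lands near 200 Hz, illustrating why plenum acoustics can couple to heat-release oscillations at the observed instability frequencies.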

    Obeticholic acid for the treatment of non-alcoholic steatohepatitis: interim analysis from a multicentre, randomised, placebo-controlled phase 3 trial

    Background: Non-alcoholic steatohepatitis (NASH) is a common type of chronic liver disease that can lead to cirrhosis. Obeticholic acid, a farnesoid X receptor agonist, has been shown to improve the histological features of NASH. Here we report results from a planned interim analysis of an ongoing, phase 3 study of obeticholic acid for NASH. Methods: In this multicentre, randomised, double-blind, placebo-controlled study, adult patients with definite NASH, a non-alcoholic fatty liver disease (NAFLD) activity score of at least 4, and fibrosis stage F2–F3, or F1 with at least one accompanying comorbidity, were randomly assigned using an interactive web response system in a 1:1:1 ratio to receive oral placebo, obeticholic acid 10 mg, or obeticholic acid 25 mg daily. Patients were excluded if cirrhosis, other chronic liver disease, elevated alcohol consumption, or confounding conditions were present. The primary endpoints for the month-18 interim analysis were fibrosis improvement (≥1 stage) with no worsening of NASH, or NASH resolution with no worsening of fibrosis, with the study considered successful if either primary endpoint was met. Primary analyses were done by intention to treat, in patients with fibrosis stage F2–F3 who received at least one dose of treatment and reached, or would have reached, the month-18 visit by the prespecified interim analysis cutoff date. The study also evaluated other histological and biochemical markers of NASH and fibrosis, and safety. This study is ongoing, and is registered with ClinicalTrials.gov, NCT02548351, and EudraCT, 2015-025601-6. Findings: Between Dec 9, 2015, and Oct 26, 2018, 1968 patients with stage F1–F3 fibrosis were enrolled and received at least one dose of study treatment; 931 patients with stage F2–F3 fibrosis were included in the primary analysis (311 in the placebo group, 312 in the obeticholic acid 10 mg group, and 308 in the obeticholic acid 25 mg group).
The fibrosis-improvement endpoint was achieved by 37 (12%) patients in the placebo group, 55 (18%) in the obeticholic acid 10 mg group (p=0·045), and 71 (23%) in the obeticholic acid 25 mg group (p=0·0002). The NASH-resolution endpoint was not met (25 [8%] patients in the placebo group, 35 [11%] in the obeticholic acid 10 mg group [p=0·18], and 36 [12%] in the obeticholic acid 25 mg group [p=0·13]). In the safety population (1968 patients with fibrosis stages F1–F3), the most common adverse event was pruritus (123 [19%] in the placebo group, 183 [28%] in the obeticholic acid 10 mg group, and 336 [51%] in the obeticholic acid 25 mg group); events were generally mild to moderate in severity. The overall safety profile was similar to that in previous studies, and the incidence of serious adverse events was similar across treatment groups (75 [11%] patients in the placebo group, 72 [11%] in the obeticholic acid 10 mg group, and 93 [14%] in the obeticholic acid 25 mg group). Interpretation: Obeticholic acid 25 mg significantly improved fibrosis and key components of NASH disease activity among patients with NASH. The results from this planned interim analysis show clinically significant histological improvement that is reasonably likely to predict clinical benefit. The study is ongoing to assess clinical outcomes.
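The trial's p-values come from its prespecified stratified analysis, but a plain pooled two-proportion z-test on the reported counts (37/311 placebo vs 71/308 for the 25 mg group) roughly reproduces the reported p≈0·0002, as this sketch shows. The test choice here is an illustrative simplification, not the study's actual statistical method.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail of the normal
    return z, p

# Placebo 37/311 vs obeticholic acid 25 mg 71/308 (fibrosis-improvement endpoint)
z, p = two_proportion_z(37, 311, 71, 308)
print(f"z = {z:.2f}, p = {p:.4f}")  # z ~ 3.7, p ~ 0.0003
```

The agreement with the published p-value suggests the stratification had little effect on this particular comparison.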

    DUNE Offline Computing Conceptual Design Report

    This document describes the conceptual design of the Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), that is, the offline computing needed to accomplish its physics goals. Those goals include (1) studying neutrino oscillations using a beam of neutrinos sent from Fermilab in Illinois to the Sanford Underground Research Facility (SURF) in Lead, South Dakota, (2) studying astrophysical neutrino sources and rare processes, and (3) understanding the physics of neutrino interactions in matter. Our emphasis is the development of the computing infrastructure needed to acquire, store, catalog, reconstruct, simulate, and analyze approximately 30 PB of data per year from DUNE and its prototypes, and the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions and advanced algorithms as HEP computing evolves, and to provide computing that achieves the physics goals of the experiment. We describe the physics objectives, organization, use cases, and proposed technical solutions.
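The quoted data volume translates into a useful back-of-envelope sustained rate. This sketch assumes the decimal petabyte convention and a simple uniform-averaging model, which ignores beam duty cycle and bursty data-taking.

```python
PB = 1e15                                  # decimal petabyte (assumption)
seconds_per_year = 365.25 * 24 * 3600

# ~30 PB/year averaged over the whole year
avg_rate_GBps = 30 * PB / seconds_per_year / 1e9
print(f"{avg_rate_GBps:.2f} GB/s")         # roughly 1 GB/s sustained
```

Even this crude average, about 1 GB/s year-round, motivates the cataloging and distributed-storage infrastructure the document describes.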

    Reconstruction of interactions in the ProtoDUNE-SP detector with Pandora

    The Pandora Software Development Kit and algorithm libraries provide pattern-recognition logic essential to the reconstruction of particle interactions in liquid argon time projection chamber detectors. Pandora is the primary event reconstruction software used at ProtoDUNE-SP, a prototype for the Deep Underground Neutrino Experiment far detector. ProtoDUNE-SP, located at CERN, is exposed to a charged-particle test beam. This paper gives an overview of the Pandora reconstruction algorithms and how they have been tailored for use at ProtoDUNE-SP. In complex events with numerous cosmic-ray and beam background particles, the simulated reconstruction and identification efficiency for triggered test-beam particles is above 80% for the majority of particle type and beam momentum combinations. Specifically, simulated 1 GeV/c charged pions and protons are correctly reconstructed and identified with efficiencies of 86.1 ± 0.6% and 84.1 ± 0.6%, respectively. The efficiencies measured for test-beam data are shown to be within 5% of those predicted by the simulation.
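The ±0.6% quoted on the efficiencies is consistent with a simple binomial counting uncertainty. The sketch below uses the Wald formula with an assumed sample size, since the abstract does not quote one; it illustrates the scale of statistics needed to reach that precision.

```python
import math

def binomial_efficiency_error(k, n):
    """Wald (simple binomial) uncertainty on an efficiency estimate k/n."""
    p = k / n
    return math.sqrt(p * (1 - p) / n)

# Assumed sample size for illustration only; not quoted in the abstract.
n = 3300
k = round(0.861 * n)               # ~86.1% of triggered test-beam particles
err = binomial_efficiency_error(k, n)
print(f"{k / n:.3f} +/- {err:.3f}")
```

A few thousand selected test-beam particles would suffice for a 0.6% statistical uncertainty at this efficiency.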

    Separation of track- and shower-like energy deposits in ProtoDUNE-SP using a convolutional neural network

    Liquid argon time projection chamber detector technology provides high spatial and calorimetric resolution for charged particles traversing liquid argon. As a result, the technology has been used in a number of recent neutrino experiments and is the technology of choice for the Deep Underground Neutrino Experiment (DUNE). To perform high-precision measurements of neutrinos in the detector, final-state particles need to be effectively identified and their energy accurately reconstructed. This article proposes an algorithm based on a convolutional neural network to classify energy deposits and reconstructed particles as track-like or arising from electromagnetic cascades. Results from testing the algorithm on experimental data from ProtoDUNE-SP, a prototype of the DUNE far detector, are presented. The network identifies track- and shower-like particles, as well as Michel electrons, with high efficiency. The performance of the algorithm is consistent between experimental data and simulation.
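The core idea of the track/shower separation, convolving charge images with learned filters that respond to line-like versus diffuse deposits, can be sketched in a few lines. This toy uses a single hand-set filter and synthetic 8×8 "images"; the article's actual network, architecture, and training are not reproduced here.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 2D "charge images": a straight track versus a diffuse shower-like blob
rng = np.random.default_rng(0)
track = np.eye(8)                       # line-like deposit
shower = rng.random((8, 8)) * 0.5       # diffuse deposit

# Hypothetical hand-set filter that rewards diagonal line segments
line_filter = np.eye(3) - (1 - np.eye(3)) / 2

track_score = conv2d(track, line_filter).max()
shower_score = conv2d(shower, line_filter).max()
print(track_score > shower_score)       # the filter fires on the track
```

A trained CNN learns many such filters from data rather than setting them by hand, and stacks them in layers, but the response mechanism is the same.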

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented as an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
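The structure that makes this simulation GPU-friendly is that each pixel's induced current is an independent calculation, so one GPU thread can handle one pixel. The sketch below uses NumPy vectorization as a CPU-side stand-in for that mapping; the Gaussian charge-cloud response is a hypothetical model for illustration, not the article's microphysics.

```python
import numpy as np

def induced_current_loop(charge, pixel_x, cloud_x, sigma):
    """Serial, CPU-style loop: one pixel at a time."""
    out = np.empty(len(pixel_x))
    for i, x in enumerate(pixel_x):
        out[i] = charge * np.exp(-0.5 * ((x - cloud_x) / sigma) ** 2)
    return out

def induced_current_parallel(charge, pixel_x, cloud_x, sigma):
    # One independent calculation per pixel: the same structure a Numba CUDA
    # kernel exploits by mapping each pixel to its own GPU thread.
    return charge * np.exp(-0.5 * ((pixel_x - cloud_x) / sigma) ** 2)

pixels = np.linspace(0.0, 3.0, 1000)    # ~10^3 pixels, as in the article
serial = induced_current_loop(1.0, pixels, 1.5, 0.2)
vectorized = induced_current_parallel(1.0, pixels, 1.5, 0.2)
print(np.allclose(serial, vectorized))  # identical results, parallel structure
```

In the article's implementation the per-pixel body is compiled to a CUDA kernel with Numba, which is where the four-orders-of-magnitude speed-up over the serial version comes from.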


